Minimax estimation of norms of a probability density: I. Lower bounds


Abstract

The paper deals with the problem of nonparametric estimation of the Lp-norm, p ∈ (1, ∞), of a probability density on R^d, d ≥ 1, from independent observations. The unknown density is assumed to belong to a ball in an anisotropic Nikolskii space. We adopt the minimax approach and derive lower bounds on the risk. In particular, we demonstrate that the accuracy of estimation procedures depends essentially on whether or not p is an integer. Moreover, we develop a general technique for deriving lower bounds on the minimax risk in problems of estimating nonlinear functionals. The proposed technique is applicable to a broad class of functionals, and it is used here for Lp-norm estimation.
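To make the estimation target concrete, here is a minimal plug-in sketch for d = 1: build a kernel density estimate and numerically integrate its p-th power. This is an illustrative baseline of my choosing, not the paper's (minimax-optimal) procedure, and the bandwidth and grid choices are ad hoc assumptions.

```python
import numpy as np

def lp_norm_plugin(sample, p=2.0, bandwidth=0.3, grid_size=400):
    """Plug-in estimate of the L_p-norm of a density on R:
    form a Gaussian kernel density estimate f_hat on a grid,
    then approximate (integral of f_hat**p)**(1/p)."""
    lo = sample.min() - 4 * bandwidth
    hi = sample.max() + 4 * bandwidth
    grid = np.linspace(lo, hi, grid_size)
    dx = grid[1] - grid[0]
    # Gaussian KDE evaluated at every grid point.
    z = (grid[:, None] - sample[None, :]) / bandwidth
    f_hat = np.exp(-0.5 * z ** 2).sum(axis=1) / (
        sample.size * bandwidth * np.sqrt(2 * np.pi))
    # Riemann-sum approximation of the integral of f_hat**p.
    return (np.sum(f_hat ** p) * dx) ** (1.0 / p)

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
est = lp_norm_plugin(x, p=2.0)
# For N(0,1) the true L2-norm is (2 * sqrt(pi)) ** -0.5, about 0.531.
```

The paper's point is precisely that naive plug-in rules of this kind need not attain the minimax rate, and that the attainable rate changes depending on whether p is an integer.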


Related works

Estimation of a k-monotone density: characterizations, consistency and minimax lower bounds.

The classes of monotone or convex (and necessarily monotone) densities on ℝ₊ can be viewed as special cases of the classes of k-monotone densities on ℝ₊. These classes bridge the gap between the classes of monotone (1-monotone) and convex decreasing (2-monotone) densities, for which asymptotic results are known, and the class of completely monotone (∞-monotone) densities on ℝ₊. In this pap...


Minimax lower bounds

Now that we have a good handle on the performance of ERM and its variants, it is time to ask whether we can do better. For example, consider binary classification: we observe n i.i.d. training samples from an unknown joint distribution P on X × {0,1}, where X is some feature space, and for a fixed class F of candidate classifiers f : X → {0,1} we let f̂_n be the ERM solution f̂_n = argmin_{f∈F} (1/n) ∑...
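The ERM rule in this excerpt can be made concrete in the simplest setting: one-dimensional threshold classifiers f_t(x) = 1{x ≥ t}. The class and the toy data below are illustrative choices of mine, not taken from the excerpt; ERM simply picks the candidate with the smallest training error.

```python
import numpy as np

def erm_threshold(X, y, thresholds):
    """Empirical risk minimization over threshold classifiers
    f_t(x) = 1{x >= t}: return the threshold t with the smallest
    empirical risk (1/n) * sum of 1{f_t(x_i) != y_i}."""
    risks = [(np.mean((X >= t).astype(int) != y), t) for t in thresholds]
    best_risk, best_t = min(risks)  # ties broken by the smaller t
    return best_t, best_risk

# Toy data: labels determined exactly by the rule x >= 0.5.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 200)
y = (X >= 0.5).astype(int)
t_hat, risk_hat = erm_threshold(X, y, np.linspace(0, 1, 101))
```

On this noiseless sample the empirical risk of the selected threshold is (near) zero; minimax lower bounds ask how well any such rule can do in the worst case over distributions P.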


Minimax Lower Bounds

Minimax Lower Bounds. Adityanand Guntuboyina, 2011. This thesis deals with lower bounds for the minimax risk in general decision-theoretic problems. Such bounds are useful for assessing the quality of decision rules. After providing a unified treatment of existing techniques, we prove new lower bounds which involve f-divergences, a general class of dissimilarity measures between probability measu...


Local Privacy and Minimax Bounds: Sharp Rates for Probability Estimation

We provide a detailed study of the estimation of probability distributions— discrete and continuous—in a stringent setting in which data is kept private even from the statistician. We give sharp minimax rates of convergence for estimation in these locally private settings, exhibiting fundamental trade-offs between privacy and convergence rate, as well as providing tools to allow movement along ...



Journal

Journal title: Bernoulli

Year: 2022

ISSN: 1573-9759, 1350-7265

DOI: https://doi.org/10.3150/21-bej1380